42 research outputs found

    Does visual experience influence arm proprioception and its lateralization? Evidence from passive matching performance in congenitally-blind and sighted adults

    In humans, body segments' position and movement can be estimated from multiple senses such as vision and proprioception. It has been suggested that vision and proprioception can influence each other and that upper-limb proprioception is asymmetrical, with proprioception of the non-dominant arm being more accurate and/or precise than proprioception of the dominant arm. However, the mechanisms underlying the lateralization of proprioceptive perception are not yet understood. Here we tested the hypothesis that early visual experience influences the lateralization of arm proprioceptive perception by comparing 8 congenitally blind adults with 8 matched, sighted right-handed adults. Their proprioceptive perception was assessed at the elbow and wrist joints of both arms using an ipsilateral passive matching task. Results support and extend the view that proprioceptive precision is better for the non-dominant arm in blindfolded sighted individuals. While this finding was rather systematic across sighted individuals, the proprioceptive precision of congenitally blind individuals was not lateralized as systematically, suggesting that the lack of visual experience during ontogenesis influences the lateralization of arm proprioception.
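
    As a point of reference for how "accuracy" and "precision" are typically quantified in such matching tasks (a general convention, not taken from this study; the symbols below are introduced purely for illustration), the constant error captures accuracy (systematic bias) and the variable error captures precision:

        \mathrm{CE} = \frac{1}{N}\sum_{i=1}^{N}\bigl(\theta^{\mathrm{match}}_i - \theta^{\mathrm{ref}}_i\bigr), \qquad
        \mathrm{VE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl(\theta^{\mathrm{match}}_i - \theta^{\mathrm{ref}}_i - \mathrm{CE}\bigr)^{2}}

    where \theta^{\mathrm{match}}_i and \theta^{\mathrm{ref}}_i denote the matched and reference joint angles on trial i.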

    Do Visual and Vestibular Inputs Compensate for Somatosensory Loss in the Perception of Spatial Orientation? Insights from a Deafferented Patient

    Bringoux L, Scotto di Cesare C, Borel L, Macaluso T, Sarlegna FR. Do Visual and Vestibular Inputs Compensate for Somatosensory Loss in the Perception of Spatial Orientation? Insights from a Deafferented Patient. Frontiers in Human Neuroscience. 2016;10:181.
    The present study investigated the consequences of a massive loss of somatosensory input on the perception of spatial orientation. The occurrence of possible compensatory processes for external (i.e., object) orientation perception and self-orientation perception was examined by manipulating visual and/or vestibular cues. To that aim, we compared the perceptual responses of a deafferented patient (GL) with those of age-matched controls in two tasks involving gravity-related judgments. In the first task, subjects had to align a visual rod with the gravitational vertical (i.e., Subjective Visual Vertical: SVV) while facing a tilted visual frame in a classic Rod-and-Frame Test. In the second task, subjects had to report whether they felt tilted when facing different visuo-postural conditions consisting of very slow pitch tilts of the body and/or the visual surroundings away from vertical. Results showed that, much more than controls, the deafferented patient was fully dependent on spatial cues issued from the visual frame when judging the SVV. On the other hand, the deafferented patient did not rely at all on visual cues for self-tilt detection. Moreover, unlike controls, the patient never reported any sensation of tilt up to 18 degrees, showing that she did not rely on vestibular (i.e., otolith) signals for the detection of very slow body tilts either. Overall, this study demonstrates that a massive somatosensory deficit substantially impairs the perception of spatial orientation, and that the use of the remaining sensory inputs available to a deafferented patient differs depending on whether the judgment concerns external versus self-orientation.

    Somatosensory Loss Influences the Adoption of Self-Centered Versus Decentered Perspectives

    The body and the self are commonly experienced as forming a unity. Experiencing the external world as distinct from the self and the body strongly relies on adopting a single self-centered perspective, which results in integrating multisensory sensations into one egocentric, body-centered reference frame. Body posture and somatosensory representations have been reported to influence perception, and specifically the reference frame relative to which multisensory sensations are coded. In the study reported here, we investigated the role of somatosensory and visual information in adopting self-centered and decentered spatial perspectives. Two deafferented patients, who have neither tactile nor proprioceptive perception below the head, and a group of age-matched control participants performed a graphesthesia task consisting of the recognition of ambiguous letters (b, d, p, and q) drawn tactilely on the head. To report which letter was drawn, participants could adopt either a self-centered perspective or a decentered one (i.e., centered on a body part or on an external location). The participants' responses can be used, in turn, to infer how the letters' left-right and top-bottom axes are assigned with respect to the corresponding axes of the body. In order to evaluate the influence of body posture, the ambiguous letters were drawn on the participants' forehead and on the left and right surfaces of the head, with the head aligned or rotated in yaw relative to the trunk. In order to evaluate the role of external information, the participants completed the task with their eyes open in one session and closed in another. The results obtained in control participants revealed that their preferred perspective varied with body posture but not with vision. Different results were obtained with the deafferented patients, who overall showed no significant effect of body posture on their preferred perspective. This result suggests that the orientation of their self is not influenced by their physical body. There was an effect of vision for only one of the two patients. The deafferented patients relied on strategies that are more prone to interindividual differences, which highlights the crucial role of somatosensory information in adopting self-centered spatial perspectives.

    Proprioceptive loss and the perception, control and learning of arm movements in humans: evidence from sensory neuronopathy

    It is uncertain how vision and proprioception contribute to adaptation of voluntary arm movements. In normal participants, adaptation to imposed forces is possible with or without vision, suggesting that proprioception is sufficient; in participants with proprioceptive loss (PL), adaptation is possible with visual feedback, suggesting that proprioception is unnecessary. In Experiment 1, adaptation to, and retention of, perturbing forces were evaluated in three chronically deafferented participants. They made rapid reaching movements to move a cursor toward a visual target, and a planar robot arm applied orthogonal velocity-dependent forces. Trial-by-trial error correction was observed in all participants. Such adaptation has been characterized with a dual-rate model: a fast process that learns quickly but retains poorly, and a slow process that learns slowly and retains well. Experiment 2 showed that the PL participants had large individual differences in learning and retention rates compared to normal controls. Experiment 3 tested participants' perception of applied forces. With visual feedback, the PL participants could report the perturbation's direction as well as controls could; without visual feedback, their thresholds were elevated. Experiment 4 showed, in healthy participants, that force direction could be estimated from head motion, at levels close to the no-vision threshold of the PL participants. Our results show that proprioceptive loss influences perception, motor control and adaptation, but that proprioception from the moving limb is not essential for adaptation to, or detection of, force fields. The differences in learning and retention seen between the three deafferented participants suggest that they achieve these tasks in idiosyncratic ways after proprioceptive loss, possibly integrating visual and vestibular information with individual cognitive strategies.
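
    A minimal sketch of the dual-rate (two-state) adaptation model referred to above, in its standard trial-by-trial form (the parameter values fitted in the study are not reproduced here; all symbols are introduced for illustration):

        x_n = x^{f}_n + x^{s}_n, \qquad e_n = f_n - x_n
        x^{f}_{n+1} = A_f\, x^{f}_n + B_f\, e_n
        x^{s}_{n+1} = A_s\, x^{s}_n + B_s\, e_n, \qquad A_f < A_s,\; B_f > B_s

    where f_n is the perturbation on trial n, x_n the net adaptive state and e_n the movement error: the fast process (large B_f, small A_f) learns quickly but retains poorly, while the slow process (small B_s, large A_s) learns slowly and retains well.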

    La main vers la cible : intégration multi-sensorielle et contrôle en ligne du mouvement de pointage

    Reaching for a target with the hand: Multi-sensory integration and online control of arm movements. The central nervous system, the interface between sensory and motor systems, allows us to interact with our environment. This review presents a critical analysis of the current knowledge on the neuropsychological processes underlying the online control of goal-directed arm movements toward visual targets. While we live in an uncertain environment, the visual and proprioceptive systems are continuously used to update body and space representations in order to control our movements. However, it is still not well understood how hand position is determined: is the hand localization process based on visual, proprioceptive, or efferent signals? It has been shown that visual, proprioceptive and internal feedback loops can be used to control fast reaching movements in flight. Recent results support the idea that peripheral and central feedback signals are integrated in an optimal fashion to control movements. This review highlights the rapidity of information processing and discusses the specificity of online motor control as a function of task constraints, before suggesting new lines of research.
    Sarlegna Fabrice. La main vers la cible : intégration multi-sensorielle et contrôle en ligne du mouvement de pointage. L'année psychologique, 2007, vol. 107, n°2, pp. 303-336.
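
    One common formalization of "optimal" integration of two position estimates, such as visual and proprioceptive estimates of hand position, is the minimum-variance (reliability-weighted) combination. This is a generic sketch, not the specific model advocated in the review, and all symbols are introduced here for illustration:

        \hat{x} = w_V\,\hat{x}_V + w_P\,\hat{x}_P, \qquad
        w_V = \frac{\sigma_P^{2}}{\sigma_V^{2}+\sigma_P^{2}}, \quad
        w_P = \frac{\sigma_V^{2}}{\sigma_V^{2}+\sigma_P^{2}}, \qquad
        \sigma^{2}_{\hat{x}} = \frac{\sigma_V^{2}\,\sigma_P^{2}}{\sigma_V^{2}+\sigma_P^{2}}

    where \hat{x}_V and \hat{x}_P are the visual and proprioceptive estimates with variances \sigma_V^{2} and \sigma_P^{2}; the combined estimate is never less reliable than either cue alone.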

    Contrôle en ligne des mouvements d'atteinte manuelle de cible : contribution des informations de localisation de la main et de la cible

    Thesis committee: Dr Jean Blouin (advisor), Pr Yann Coello (examiner), Dr Michel Desmurget (reviewer), Pr Vincent Nougier (reviewer), Pr Jean-Jacques Temprado (committee president), Dr Robert van Beers (examiner).
    Despite extensive research in the field of motor control, it is still not fully understood how visual and proprioceptive information about hand and target positions is used to plan and control goal-directed arm movements. We developed an original method to investigate the online control of movement. In each study, adult participants were asked to reach visual targets as accurately as possible. Modifications of either the target position or the seen hand position were triggered randomly, near movement initiation, and we analysed the kinematics of the rapid reaching movements. To reduce the putative role of cognitive, high-level processes in the online control of movement, we produced these perturbations such that they were not perceived by the participants (i.e., they were never able to report such events verbally).
    Study 1 demonstrated that visual information about target position was taken into account earlier, and to a greater extent, than visual information about hand position in the real-time control of rapid reaching movements. The limited use of hand visual feedback in this study appeared to conflict with a number of reports emphasizing its crucial contribution to the control of aimed arm movements. A detailed analysis of the literature suggested that the nature of the task, essentially one in which movement amplitude had to be controlled, could be responsible for the limited processing of visual information. Study 2 was designed to examine the use of hand visual feedback in the online control of rapid reaching movements when only movement direction has to be controlled. It highlighted the ability of the nervous system to process visual and proprioceptive information about hand position rapidly and accurately: visual information was used significantly and consistently, a result differing strikingly from that of the first study. It then appeared interesting to investigate the online control of reaching movements when modifications of hand visual feedback should affect both movement amplitude and direction. The underlying question was whether the online control of both components would be identical to what was observed in the two previous studies, or whether an interaction-type effect would emerge. Study 3 showed that the directional control of rapid movements on the basis of hand visual feedback is limited by the control of movement amplitude; the requirement to stop the rapid hand movement accurately on the target by controlling braking (antagonist) activity can thus legitimately be termed a constraint.
    At this point, we had shown significant contributions of proprioceptive information and of visual signals related to hand and target localisation to the control of reaching movements during their execution. To further probe whether these feedback loops are indispensable, we analysed the performance of a proprioceptively deafferented patient in complete darkness, i.e., without vision of the hand or the target. Study 4 demonstrated that reaching movements can be controlled in flight despite the absence of peripheral sensory feedback loops on hand and target positions during the movement. We thus propose a model of motor control based on the optimal use of feedforward and feedback mechanisms and on the continuity and rapidity of information processing.
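
    The combination of feedforward and feedback mechanisms can be illustrated, very schematically, with a standard predictor-corrector (observer-style) update. This is a generic formulation offered for illustration only, not the specific model developed in the thesis, and all symbols are assumptions:

        \hat{x}_{t+1} = \underbrace{g(\hat{x}_t, u_t)}_{\text{feedforward prediction from the motor command}} + \underbrace{K_t\,\bigl(y_t - h(\hat{x}_t)\bigr)}_{\text{feedback correction from sensory signals}}

    where \hat{x}_t is the estimated hand state, u_t the motor command, y_t the available visual and/or proprioceptive feedback, and K_t a gain weighting the sensory correction; the prediction term can keep driving online corrections even when peripheral feedback y_t is unavailable.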

    Visual guidance of arm reaching: Online adjustments of movement direction are impaired by amplitude control
